-
Background. Vaccine misinformation has spread widely on social media, but attempts to combat it have not taken advantage of the attributes of social media platforms for health education. Methods. The objective was to test the efficacy of moderated social media discussions about COVID-19 vaccines in private Facebook groups. Unvaccinated U.S. adults were recruited using Amazon’s Mechanical Turk and randomized. In the intervention group, moderators posted two informational posts per day for 4 weeks and engaged in relationship-building interactions with group members. In the control group, participants received a referral to Facebook’s COVID-19 Information Center. Follow-up surveys with participants (N = 478) were conducted 6 weeks post-enrollment. Results. At 6-week follow-up, no differences were found in vaccination rates. Intervention participants were more likely to show improvements in their COVID-19 vaccination intentions (vs. staying the same or declining) compared with control (p = .03). They also improved more in their intentions to encourage others to vaccinate for COVID-19. Mean COVID-19 vaccine confidence and intention scores did not differ between groups. General vaccine confidence and perceived responsibility to vaccinate were higher in the intervention group than in the control group. Most participants in the intervention group reported high levels of satisfaction. Participants engaged with content (e.g., commented, reacted) 11.8 times on average over the course of 4 weeks. Conclusions. Engaging with vaccine-hesitant individuals in private Facebook groups improved some COVID-19 vaccine-related beliefs and represents a promising strategy.
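The primary contrast described above is categorical (intentions improved vs. stayed the same or declined) between two randomized arms. As a minimal illustration only, not the authors' analysis code, the sketch below runs a chi-square test on counts arranged that way; all numbers are hypothetical placeholders.

    from scipy.stats import chi2_contingency

    # Rows: intervention, control.
    # Columns: intentions improved, stayed same or declined.
    # Counts are hypothetical placeholders, not the study's data.
    table = [[70, 170],
             [48, 190]]

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")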
-
Background: Distrust and partisan identity are theorized to undermine health communications. We examined the role of these factors in the efficacy of discussion groups intended to promote vaccine uptake. Method: We analyzed survey data from unvaccinated Facebook users (N = 371) living in the US between January and April 2022. Participants were randomly assigned to Facebook discussion groups (intervention) or referred to Facebook’s COVID-19 Information Center (control). We used analysis of covariance (ANCOVA) to test whether the intervention was more effective than the control at changing vaccination intentions and beliefs within subgroups based on participants’ partisan identity, political views, and information trust views. Results: We found a significant interaction between the intervention and trust in public health institutions (PHIs) for improving intentions to vaccinate (P = .04), intentions to encourage others to vaccinate (P = .03), and vaccine confidence beliefs (P = .01). Among participants who trusted PHIs, those in the intervention had higher posttest intentions to vaccinate (P = .008) and intentions to encourage others to vaccinate (P = .002) compared to the control. Among non-conservatives, participants in the intervention had higher posttest intentions to vaccinate (P = .048). The intervention was more effective at improving intentions to encourage others to vaccinate within the subgroups of Republicans (P = .03), conservatives (P = .02), and participants who distrusted government (P = .02). Conclusions: Facebook discussion groups were more effective for people who trusted PHIs and for non-conservatives. Health communicators may need to segment health messaging and develop strategies around trust views.
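The subgroup tests above are ANCOVA models with an arm-by-moderator interaction. A minimal sketch, assuming a per-participant table with hypothetical column names (posttest_intent, pretest_intent, arm, trust_phi), not the study's actual code:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical survey file: one row per participant.
    df = pd.read_csv("survey.csv")

    # ANCOVA: posttest intention adjusted for the pretest covariate, with an
    # intervention-by-trust interaction. All column names are assumptions.
    model = smf.ols(
        "posttest_intent ~ pretest_intent + C(arm) * C(trust_phi)", data=df
    ).fit()
    print(model.summary())
    # The C(arm):C(trust_phi) coefficient tests whether the intervention
    # effect differs between participants who do and do not trust PHIs.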
-
Online misinformation promotes distrust in science, undermines public health, and may drive civil unrest. During the coronavirus disease 2019 pandemic, Facebook—the world’s largest social media company—began to remove vaccine misinformation as a matter of policy. We evaluated the efficacy of these policies using a comparative interrupted time-series design. We found that Facebook removed some antivaccine content, but we did not observe decreases in overall engagement with antivaccine content. Provaccine content was also removed, and antivaccine content became more misinformative, more politically polarized, and more likely to be seen in users’ newsfeeds. We explain these findings as a consequence of Facebook’s system architecture, which provides substantial flexibility to motivated users who wish to disseminate misinformation through multiple channels. Facebook’s architecture may therefore afford antivaccine content producers several means to circumvent the intent of misinformation removal policies.
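A comparative interrupted time series is typically fit as a segmented regression with a comparison series. The sketch below uses assumed variable names (engagement, time, post, time_since, treated) and is not the paper's specification:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical panel: weekly engagement counts for antivaccine
    # (treated = 1) and comparison (treated = 0) content.
    df = pd.read_csv("weekly_engagement.csv")

    # time: weeks since series start; post: 1 after the policy change;
    # time_since: weeks since the policy change (0 beforehand).
    model = smf.ols(
        "engagement ~ time + treated + post + time_since"
        " + treated:post + treated:time_since",
        data=df,
    ).fit(cov_type="HAC", cov_kwds={"maxlags": 4})  # robust to autocorrelation
    print(model.summary())
    # treated:post estimates the policy-associated level change in antivaccine
    # engagement relative to the comparison series; treated:time_since
    # estimates the relative slope change.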
